Low-Complexity Codes for Random and Clustered High-Order Failures in Storage Arrays
Author
Abstract
RC (Random/Clustered) codes are a new efficient array-code family for recovering from 4-erasures. RC codes correct most 4-erasures, and essentially all 4-erasures that are clustered. Clustered erasures are introduced as a new erasure model for storage arrays. This model draws its motivation from correlated device failures caused by physical proximity of devices, or by age proximity of endurance-limited solid-state drives. The reliability of storage arrays that employ RC codes is analyzed and compared to known codes. The new RC code is significantly more efficient, in all practical implementation factors, than the best known 4-erasure-correcting MDS code. These factors include: small-write update complexity, full-device update complexity, decoding complexity, and the number of supported devices in the array.
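To make the clustered-erasure notion more concrete, the following minimal Python sketch counts how rare clustered 4-erasures are among all 4-erasure patterns. It assumes, for illustration only, a window-based definition in which a 4-erasure is "clustered" when all erased columns fall within a small run of consecutive devices; the paper's exact definition of clustering may differ, and the names `is_clustered`, `clustered_fraction`, and the `window` parameter are hypothetical.

```python
from itertools import combinations

def is_clustered(erased, window):
    """True if all erased column indices fit within `window` consecutive devices."""
    erased = sorted(erased)
    return erased[-1] - erased[0] < window

def clustered_fraction(n, r=4, window=4):
    """Fraction of all r-erasure patterns in an n-column array that are
    clustered under the illustrative window-based definition above."""
    patterns = list(combinations(range(n), r))
    clustered = sum(1 for p in patterns if is_clustered(p, window))
    return clustered / len(patterns)

if __name__ == "__main__":
    # In a 20-device array only a small fraction of all 4-erasure patterns
    # are clustered, which is why a code targeting clustered (plus most
    # random) 4-erasures can be cheaper than a full 4-erasure MDS code.
    print(f"clustered fraction for n=20: {clustered_fraction(20):.4f}")
```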
Similar Papers
SCAN: An Efficient Sector Failure Recovery Algorithm for RAID-6 Codes
Recent studies show that disks fail much more often in real systems than specified in their data sheets, and RAID-5 may not be able to provide the reliability needed for practical systems. It is desirable to have disk arrays and clustered storage systems with higher data redundancy, such as RAID-6. Meanwhile, the latest research also indicates that disk sector failures occur much more often than whole disk failur...
LDPC Codes for Two-Dimensional Arrays
Binary codes over two-dimensional arrays are very useful in data storage, where each array column represents a storage device or unit that may suffer failure. In this paper we propose a new framework for probabilistic construction of codes on two-dimensional arrays. Instead of a pure combinatorial erasure model used in traditional array codes, we propose a mixed combinatorial-probabilistic mode...
Search Based Weighted Multi-Bit Flipping Algorithm for High-Performance Low-Complexity Decoding of LDPC Codes
In this paper, two new hybrid algorithms are proposed for decoding Low Density Parity Check (LDPC) codes. The original version of the proposed algorithms is named Search Based Weighted Multi Bit Flipping (SWMBF). The main idea of these algorithms is to flip multiple variable bits in each iteration, choosing the flips that lead to the syndrome vector with the least Hamming weight. To achieve this, the proposed algo...
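The multi-bit flipping idea described above can be illustrated with a minimal, generic greedy decoder. This sketch is not the SWMBF algorithm itself: it simply scores every variable bit by how much flipping it alone would reduce the syndrome Hamming weight, then flips the few best-scoring bits per iteration. The names `bit_flip_decode` and `flips_per_iter`, and the stopping rule, are assumptions of this sketch.

```python
import numpy as np

def bit_flip_decode(H, y, max_iter=50, flips_per_iter=2):
    """Greedy multi-bit flipping decoding sketch.

    H: (m, n) binary parity-check matrix; y: length-n hard-decision word.
    Per iteration, flip up to `flips_per_iter` bits whose individual flips
    most reduce the syndrome Hamming weight.
    """
    x = np.array(y, dtype=int) % 2
    for _ in range(max_iter):
        syndrome = H.dot(x) % 2
        weight = syndrome.sum()
        if weight == 0:
            return x, True                     # all parity checks satisfied
        # Flipping bit j toggles exactly the checks in column j of H,
        # so the new syndrome weight is that of (syndrome XOR H[:, j]).
        gains = [weight - (syndrome ^ H[:, j]).sum() for j in range(H.shape[1])]
        best = np.argsort(gains)[::-1][:flips_per_iter]
        improving = [j for j in best if gains[j] > 0]
        if not improving:                      # no single flip helps: give up
            return x, False
        for j in improving:
            x[j] ^= 1
    return x, (H.dot(x) % 2).sum() == 0

# Tiny usage example with the (7,4) Hamming code's parity-check matrix:
H = np.array([[1, 1, 0, 1, 1, 0, 0],
              [1, 0, 1, 1, 0, 1, 0],
              [0, 1, 1, 1, 0, 0, 1]])
received = np.zeros(7, dtype=int)
received[2] ^= 1                               # inject a single bit error
decoded, ok = bit_flip_decode(H, received, flips_per_iter=1)
print(ok, decoded)                             # True, all-zero codeword recovered
```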
Triple-star: a Coding Scheme with Optimal Encoding Complexity for Tolerating Triple Disk Failures in RAID
Low encoding/decoding complexity is essential for practical storage systems. This paper presents a new Maximum Distance Separable (MDS) array code, called Triple-Star, for tolerating triple disk failures in the Redundant Arrays of Inexpensive Disks (RAID) architecture. Triple-Star is an extension of the double-erasure-correcting Rotary code and a modification of the generalized triple-erasure-corre...